Overview
The Big Data Architect will lead the design, development, and implementation of comprehensive big data solutions for our enterprise clients. This role is pivotal in architecting highly scalable and fault-tolerant data systems that support critical business needs. You’ll work closely with cross-functional teams to ensure optimal data integration, performance, and accessibility, while also mentoring junior engineers in best practices.
Key Responsibilities
• Architect & Design Solutions: Design end-to-end big data architectures covering data ingestion, processing, storage, and analytics. Ensure systems are optimized for performance, reliability, and scalability.
• Technology Assessment & Recommendations: Evaluate and recommend data platforms and frameworks (e.g., Kafka, ELK Stack, Kubernetes) to align with business goals and technical requirements.
• Data Integration: Develop robust data pipelines for ingesting data from diverse sources including databases, data warehouses, and APIs.
• Performance Optimization: Identify bottlenecks in data systems and implement enhancements in areas such as data partitioning, indexing, and caching to improve efficiency.
• Scalable & Reliable Architecture: Build solutions that support high data volumes, maintain availability, and ensure data consistency across distributed environments.
• Documentation & Communication: Produce technical documentation, architecture diagrams, and solution blueprints, making complex design decisions accessible to technical and non-technical stakeholders.
• Mentorship & Collaboration: Support team development by mentoring junior engineers and promoting best practices in big data design and operations. Work closely with data architects, platform engineers, and information security teams to deliver integrated solutions.
• Kubernetes & Cloud Adoption: Develop and implement strategies for Kubernetes orchestration in cloud and hybrid environments, with an emphasis on containerization (Docker, Tanzu) and modernization.
Qualifications
• Education & Experience: Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience); 10+ years in IT, including extensive experience in data architecture and cloud environments.
• Programming Skills: Proficiency in Java, Python, JavaScript, and Go, with strong server-side programming experience.
• Big Data & Cloud Technology: Strong expertise in AWS, Kubernetes, Docker, and hybrid cloud infrastructure. Familiarity with data integration tools and frameworks (Kafka, ELK Stack).
• Data Storage Systems: Skilled in relational databases (MySQL, PostgreSQL), cloud object storage (Amazon S3), and data pipeline management.
• Security & Compliance: Experience in data security protocols, compliance standards, and data access authorization.
• Leadership & Communication: Excellent written and verbal communication skills with a proven track record of translating technical concepts for business audiences. Experience managing Agile projects and fostering collaborative, high-performing teams.
• Preferred Skills: Experience with CI/CD practices, AI/ML operationalization, and legacy modernization strategies. Familiarity with Tanzu Kubernetes Grid or similar services is a plus.